# Serbian language optimization

## Yugo55A-GPT 4bit
License: MIT · Author: datatab
Yugo55A-GPT is a Serbian-optimized large language model merged from several high-performing models, and it delivers strong results in Serbian LLM evaluations.
Tags: Large Language Model, Transformers, Other
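A minimal loading sketch for a 4-bit variant like this one, using Hugging Face transformers with bitsandbytes quantization. The repository id and prompt below are assumptions for illustration; check the model's own page for the exact identifier and any published 4-bit weights.

```python
# Sketch: load a Serbian-optimized causal LM in 4-bit via transformers + bitsandbytes.
# "datatab/Yugo55A-GPT" is an assumed repo id used for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "datatab/Yugo55A-GPT"  # assumption; substitute the actual (4-bit) repo id

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                  # quantize weights to 4-bit at load time
    bnb_4bit_compute_dtype="bfloat16",  # run compute in bf16 for stability
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # requires the accelerate package
)

prompt = "Koji je glavni grad Srbije?"  # "What is the capital of Serbia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```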
## Tito-7B-slerp
License: Apache-2.0 · Author: Stopwolf
Tito-7B-slerp is a large language model created by merging the YugoGPT and AlphaMonarch-7B models using the mergekit tool, and it performs well on both Serbian and English tasks.
Tags: Large Language Model, Transformers
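For intuition, the "slerp" in the model's name refers to spherical linear interpolation between the parent models' weights. The sketch below illustrates the slerp formula applied to two same-shaped weight tensors; it is not mergekit's implementation, and the tensors are random stand-ins.

```python
# Illustrative spherical linear interpolation (slerp) between two weight tensors.
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Interpolate between v0 and v1 along the arc connecting their directions."""
    v0_n = v0 / (v0.norm() + eps)
    v1_n = v1 / (v1.norm() + eps)
    dot = torch.clamp((v0_n * v1_n).sum(), -1.0, 1.0)
    omega = torch.arccos(dot)            # angle between the two weight vectors
    if omega.abs() < eps:                # nearly parallel: fall back to plain lerp
        return (1.0 - t) * v0 + t * v1
    sin_omega = torch.sin(omega)
    return (torch.sin((1.0 - t) * omega) / sin_omega) * v0 \
         + (torch.sin(t * omega) / sin_omega) * v1

# Example: blend two same-shaped matrices halfway (stand-ins for parent model weights).
w_yugo = torch.randn(4096, 4096)   # stand-in for a YugoGPT weight matrix
w_alpha = torch.randn(4096, 4096)  # stand-in for an AlphaMonarch-7B weight matrix
w_merged = slerp(0.5, w_yugo.flatten(), w_alpha.flatten()).reshape(4096, 4096)
```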